
    Entanglement between a telecom photon and an on-demand multimode solid-state quantum memory

    Entanglement between photons at telecommunication wavelengths and long-lived quantum memories is one of the fundamental requirements of long-distance quantum communication. Quantum memories featuring on-demand read-out and multimode operation are additional precious assets that will benefit the communication rate. In this work we report the first demonstration of entanglement between a telecom photon and a collective spin excitation in a multimode solid-state quantum memory. Photon pairs are generated through widely non-degenerate parametric down-conversion, featuring energy-time entanglement between the telecom-wavelength idler and a visible signal photon. The latter is stored in a Pr3+:Y2SiO5 crystal as a spin wave using the full Atomic Frequency Comb scheme. We then recall the stored signal photon and analyze the entanglement using the Franson scheme. We measure conditional fidelities of 92(2)% for excited-state storage, enough to violate a CHSH inequality, and 77(2)% for spin-wave storage. Taking advantage of the on-demand read-out from the spin state, we extend the entanglement storage in the quantum memory for up to 47.7 μs, which could allow for the distribution of entanglement between quantum nodes separated by distances of up to 10 km.
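
    A back-of-the-envelope check of the two quoted figures, as a hedged sketch: assuming the retrieved two-photon state is Werner-like (a textbook simplification, not a claim made by the paper), the conditional fidelity maps onto a CHSH parameter, and the storage time maps onto a fibre separation.

        import math

        # Hedged sanity check, not from the paper: for a Werner state, a
        # fidelity F to a maximally entangled state gives a visibility
        # V = (4F - 1) / 3 and a maximal CHSH value S = 2 * sqrt(2) * V.
        def chsh_from_fidelity(fidelity: float) -> float:
            visibility = (4.0 * fidelity - 1.0) / 3.0
            return 2.0 * math.sqrt(2.0) * visibility

        for label, fid in [("excited-state storage", 0.92), ("spin-wave storage", 0.77)]:
            s = chsh_from_fidelity(fid)
            verdict = "violates" if s > 2 else "stays below"
            print(f"{label}: F = {fid:.2f} -> S = {s:.2f} ({verdict} the classical bound S = 2)")

        # Light in optical fibre travels at roughly 2e8 m/s, so a 47.7 us
        # storage time corresponds to about 2e8 * 47.7e-6 ~ 9.5 km ~ 10 km.
        print(f"reachable node separation ~ {2e8 * 47.7e-6 / 1e3:.1f} km")

    Consistently with the abstract, under this simplified model the 92(2)% fidelity lands above the CHSH bound while the 77(2)% spin-wave figure falls just below it.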

    Biophotons and emergence of quantum coherence: a diffusion entropy analysis

    We study the emission of photons from germinating seeds using an experimental technique designed to detect light of extremely small intensity. We analyze the dark count signal without germinating seeds as well as the photon emission during the germination process. The technique of analysis adopted here, called diffusion entropy analysis (DEA) and originally designed to measure the temporal complexity of astrophysical, sociological and physiological processes, rests on Kolmogorov complexity. The updated version of DEA used in this paper is designed to determine whether the signal complexity is generated by non-ergodic crucial events with a non-stationary correlation function, by the infinite memory of a stationary but non-integrable correlation function, or by a mixture of both processes. We find that the dark count yields the ordinary scaling, thereby showing that no complexity of either kind occurs without seeds in the chamber. In the presence of seeds in the chamber an anomalous scaling emerges, reminiscent of that found in neuro-physiological processes. However, this is a mixture of both processes, and as germination progresses the non-ergodic component tends to vanish and the complexity becomes dominated by the stationary infinite memory. We illustrate some conjectures ranging from stress-induced annihilation of crucial events to the emergence of quantum coherence.
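
    Since the result hinges on the DEA scaling exponent, a minimal sketch of the basic DEA recipe may help; this is the standard algorithm from the DEA literature, not the paper's updated variant, and the bin count, window sizes and synthetic test signal are illustrative assumptions.

        import numpy as np

        # Basic diffusion entropy analysis: slide a window of length t over the
        # signal, treat the windowed sums as positions of diffusing trajectories,
        # estimate the Shannon entropy S(t) of their histogram, and read the
        # scaling exponent delta from S(t) ~ A + delta * ln(t).
        def dea(signal, window_sizes, bins=64):
            entropies = []
            for t in window_sizes:
                positions = np.convolve(signal, np.ones(t), mode="valid")
                density, edges = np.histogram(positions, bins=bins, density=True)
                width = edges[1] - edges[0]
                density = density[density > 0]
                entropies.append(-np.sum(density * np.log(density)) * width)
            delta, _ = np.polyfit(np.log(window_sizes), entropies, 1)  # slope = delta
            return delta

        # Uncorrelated Gaussian noise should recover the ordinary scaling
        # delta ~ 0.5; crucial events or infinite memory shift delta away from it.
        rng = np.random.default_rng(0)
        delta = dea(rng.normal(size=100_000), [16, 32, 64, 128, 256, 512, 1024])
        print(f"estimated scaling exponent delta ~ {delta:.2f}")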

    Simulations of Hot Bubbles in the ICM

    We review the general properties of the intracluster medium (ICM) in clusters that host a cooling flow, and in particular the effects on the ICM of the injection of hot plasma by a powerful active galactic nucleus (AGN). It is observed that, in some cases, the hot plasma produces cavities in the ICM that finally detach and rise, perhaps buoyantly. The gas dynamics induced by the rising bubbles can help explain the absence of a cooled gas component in clusters with a cooling flow. This scenario is explored using numerical simulations.
    Comment: 13 pages, no figures. Accepted for publication in Modern Physics Letters

    The On-Site Analysis of the Cherenkov Telescope Array

    The Cherenkov Telescope Array (CTA) observatory will be one of the largest ground-based very high-energy gamma-ray observatories. The On-Site Analysis will be the first CTA scientific analysis of data acquired from the array of telescopes, at both the northern and southern sites. The On-Site Analysis will have two pipelines: the Level-A pipeline (also known as Real-Time Analysis, RTA) and the Level-B one. The RTA performs data quality monitoring and must be able to issue automated alerts on variable and transient astrophysical sources within 30 seconds of the last acquired Cherenkov event that contributes to the alert, with a sensitivity not worse than that of the final pipeline by more than a factor of 3. The Level-B Analysis has a better sensitivity (not worse than the final one by more than a factor of 2) and its results should be available within 10 hours of the acquisition of the data: for this reason this analysis could be performed at the end of an observation or the next morning. The latency (in particular for the RTA) and sensitivity requirements are challenging because of the large data rate, a few GByte/s. The remote connection to the CTA candidate site, with a rather limited network bandwidth, makes the size of the exported data extremely critical and prevents any real-time processing of the data outside the telescope site. For these reasons the analysis will be performed on-site, on infrastructure co-located with the telescopes, with limited electrical power availability and a reduced possibility of human intervention. This means, for example, that the on-site hardware infrastructure should have low power consumption. A substantial effort towards the optimization of the high-throughput computing services is envisioned, to provide hardware and software solutions with high throughput and low power consumption at low cost.
    Comment: In Proceedings of the 34th International Cosmic Ray Conference (ICRC2015), The Hague, The Netherlands. All CTA contributions at arXiv:1508.0589
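
    A hedged illustration of why the analysis is forced on-site: the 5 GB/s figure below is an assumed instance of the abstract's "a few GByte/s", and the night length and uplink bandwidth are placeholders, while the 30 s and 10 h latencies come from the text.

        # Back-of-the-envelope data budget behind the on-site requirement.
        CAMERA_RATE_GB_S = 5.0    # assumed instance of "a few GByte/s"
        OBSERVING_NIGHT_H = 8.0   # assumed length of an observing night
        UPLINK_GBIT_S = 1.0       # assumed off-site network bandwidth

        night_volume_tb = CAMERA_RATE_GB_S * OBSERVING_NIGHT_H * 3600 / 1000
        export_time_h = night_volume_tb * 1000 * 8 / UPLINK_GBIT_S / 3600

        print(f"one night of raw data  : ~{night_volume_tb:.0f} TB")
        print(f"time to export off-site: ~{export_time_h:.0f} h over {UPLINK_GBIT_S:.0f} Gbit/s")
        # ~144 TB per night against ~320 h of export time: shipping raw data
        # off-site in real time is impossible, so both the 30 s RTA and the
        # 10 h Level-B pipeline must run co-located with the telescopes.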

    On the origin of radio-loudness in AGNs and its relationship with the properties of the central supermassive black hole

    We investigate the relationship between the mass of central supermassive black holes and the radio loudness of active galactic nuclei. We use the most recent calibrations to derive virial black hole masses for samples of radio-loud QSOs for which relatively small masses (M_BH < 10^8 M_sun) have been estimated in the literature. We take into account the effect of radiation pressure on the BLR, which reduces the effective gravitational potential experienced by the broad-line clouds and affects the mass estimates of bright quasars. We show that in well-defined samples of nearby low-luminosity AGNs, QSOs and AGNs from the SDSS, radio-loud (RL) AGNs invariably host SMBHs exceeding ~10^8 M_sun. On the other hand, radio-quiet (RQ) AGNs are associated with a much larger range of black hole masses. The overall result still holds even without correcting the BH mass estimates for the effects of radiation pressure. We present a conjecture based on these results, which aims at explaining the origin of radio-loudness in terms of two fundamental parameters: the spin of the black hole and the black hole mass. We speculate that in order to produce a RL AGN both of the following requirements must be satisfied: 1) the black hole mass M_BH has to be larger than ~10^8 M_sun; 2) the spin of the BH must be significant, in order to satisfy theoretical requirements. Taking into account the most recent observations, we envisage a scenario in which the merger history of the host galaxy plays a fundamental role in accounting for both the properties of the AGN and the galaxy morphology, which in our picture are strictly linked. RL sources might be obtained only through major dry mergers involving BHs of large mass, which would give rise to both the core morphology and the significant black hole spin needed.
    Comment: 11 pages, 4 figures, accepted for publication in MNRAS
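
    For orientation, a sketch of the single-epoch virial mass estimate that such studies rely on; the functional form is the standard one, but the zero points below are illustrative assumptions, not this paper's calibration, and the radiation-pressure term is only schematic.

        import math

        # Virial mass in the standard single-epoch form
        #   M_BH ~ 10^a * (FWHM / 1000 km/s)^2 * (lambda*L_5100 / 1e44 erg/s)^0.5,
        # in solar masses, plus a radiation-pressure correction proportional to
        # the luminosity. a = 6.91 and log_g = 7.6 are placeholder zero points.
        def virial_mass(fwhm_kms, l5100_over_1e44, a=6.91, log_g=7.6):
            gravitational = 10**a * (fwhm_kms / 1000.0) ** 2 * math.sqrt(l5100_over_1e44)
            radiation_pressure = 10**log_g * l5100_over_1e44
            return gravitational + radiation_pressure

        # A bright quasar with FWHM = 3000 km/s and lambda*L_5100 = 1e45 erg/s:
        m_bh = virial_mass(3000.0, 10.0)
        verdict = "passes" if m_bh > 1e8 else "fails"
        print(f"M_BH ~ {m_bh:.1e} M_sun ({verdict} the ~1e8 M_sun radio-loud threshold)")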

    The Mediterranean ocean Forecasting System

    The Mediterranean Forecasting System (MFS) has been operational since 2000 and is continuously improved in the framework of international projects. The system is part of the Mediterranean Operational Oceanography Network (MOON), and MFS is coordinated and operated by the Italian Group of Operational Oceanography (GNOO). The latest upgrades and integrations to MFS have been undertaken in the EU MERSEA and BOSS4GMES projects. Since October 2005, ten-day forecasts are produced daily, as well as 15 days of analyses once a week. The daily forecast and weekly analysis data are available in real time to users through a dedicated ftp service, and every day a web bulletin is published on the web site (http://gnoo.bo.ingv.it/mfs). A continuous near-real-time evaluation of the forecasts and analyses produced by MFS has been developed in order to continuously verify the system and to provide useful information to users. The R&D is focused on different aspects of the system. A new basin-scale ocean model, nested in the operational MERCATOR global model, has been developed and run operationally in real time for a test period, together with a new assimilation scheme based on 3DVAR. This system is now under evaluation. Important activities have been carried out to: implement and test Bayesian ensemble and super-ensemble methodologies for the Mediterranean Sea; produce 20 years of re-analysis; re-formulate the air-sea flux bulk formulae; and develop dedicated products to support particular requests of end users, such as indicators, real-time oil spill forecasting, and search & rescue.

    WARP liquid argon detector for dark matter survey

    The WARP programme is a graded programme intended to search for cold dark matter in the form of WIMPs. These particles may produce, via weak interactions, nuclear recoils in the energy range 10-100 keV. A cryogenic noble liquid like argon, already used in the realization of very large detectors, permits the simultaneous detection of both the ionisation and the scintillation induced by an interaction, suggesting the possibility of discriminating between nuclear recoils and electron-mediated events. A 2.3-litre two-phase argon detector prototype has been used to perform several tests of the proposed technique. The next step is the construction of a device with a 100-litre sensitive volume, with a potential sensitivity a factor of 100 better than presently existing experiments.
    Comment: Proceedings of the 6th UCLA Symposium on Sources and Detection of Dark Matter and Dark Energy in the Universe
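
    The discrimination idea in the abstract can be sketched as a simple cut; the threshold and the toy event values below are illustrative assumptions, not WARP's calibration.

        # Two-phase argon detectors record a prompt scintillation pulse (S1) and
        # a delayed ionisation-induced pulse (S2). Nuclear recoils yield
        # relatively less ionisation than electron-mediated events, so a cut on
        # the S2/S1 ratio separates the two populations.
        S2_OVER_S1_CUT = 1.0  # assumed boundary: below -> nuclear-recoil-like

        def classify(s1, s2):
            ratio = s2 / s1
            return "nuclear recoil candidate" if ratio < S2_OVER_S1_CUT else "electron-mediated background"

        toy_events = [(10.0, 2.0), (8.0, 40.0), (15.0, 6.0)]  # (S1, S2), arbitrary units
        for s1, s2 in toy_events:
            print(f"S1={s1:5.1f}  S2={s2:5.1f}  S2/S1={s2 / s1:5.2f} -> {classify(s1, s2)}")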

    Genetic determinants of complement activation in the general population

    Complement is a fundamental component of the innate immune response. Its alterations are associated with severe systemic diseases. To illuminate the genetic underpinnings of complement, we conduct genome-wide association studies of the functional activity of the classical (CP), lectin (LP), and alternative (AP) complement pathways in the Cooperative Health Research in South Tyrol study (n = 4,990). We identify seven loci, encompassing 13 independent, pathway-specific variants located in or near complement genes (CFHR4, C7, C2, MBL2) and non-complement genes (PDE3A, TNXB, ABO), explaining up to 74% of the complement pathways' genetic heritability and implicating long-range haplotypes associated with LP at MBL2. Two-sample Mendelian randomization analyses, supported by transcriptome- and proteome-wide colocalization, confirm known causal pathways, establish within-complement feedback loops, and implicate causality of ABO on LP and of CFHR2 and C7 on AP. LP causally influences collectin-11 and KAAG1 levels and the risk of mouth ulcers. These results build a comprehensive resource to investigate the role of complement in human health.
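
    A minimal sketch of the two-sample Mendelian randomization step mentioned above: per-instrument Wald ratios pooled by inverse-variance weighting (the standard IVW estimator; the summary statistics below are toy numbers, not the study's data).

        import numpy as np

        # IVW estimate: each genetic instrument gives a Wald ratio
        # beta_outcome / beta_exposure; pooling with weights equal to the
        # inverse of the ratio's (first-order) variance yields the estimate.
        def ivw_estimate(beta_exp, beta_out, se_out):
            beta_exp, beta_out, se_out = map(np.asarray, (beta_exp, beta_out, se_out))
            wald = beta_out / beta_exp
            weights = (beta_exp / se_out) ** 2
            estimate = np.sum(weights * wald) / np.sum(weights)
            return estimate, np.sqrt(1.0 / np.sum(weights))

        # Toy instruments for an exposure (e.g. LP activity) and an outcome:
        est, se = ivw_estimate(beta_exp=[0.30, 0.25, 0.40],
                               beta_out=[0.15, 0.11, 0.22],
                               se_out=[0.03, 0.04, 0.05])
        print(f"IVW causal estimate = {est:.2f} +/- {se:.2f}")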

    The OAS-Bologna Computing Cluster and the Available Scientific Software

    Within the Osservatorio di Astrofisica e Scienza dello Spazio di Bologna (OAS) we have always needed powerful, shared computing resources on which scientists can process the scientific data of projects and missions. The OAS Cluster was created precisely to meet this need and is hosted in the computing centre of the OAS building at the CNR campus in Bologna. Everyone affiliated with OAS can request registration in the cluster's LDAP authentication system and thereby access the login nodes over the Internet, from which they can submit their jobs. The cluster can be used in two modes: interactive, and the batch mode more typical of a cluster. Interactive access allows computations to be launched in real time, including graphically through suitable virtual-console programs; this kind of processing runs directly on the login nodes. The batch mode uses the Slurm queueing system to submit jobs in an organized way to the more powerful compute nodes, which are not directly accessible to users. The cluster also provides shared storage organized into HOME for users' personal data, DATA for processing results, PROGRAMMI for storing processing modules, and LUSTRE for parallel computing. The advantage of a cluster is that users find the programs and compilers they need already installed in their main versions, and they have good computing power and storage space at their disposal, so they can work far more comfortably than on their personal computers. The OAS Cluster does not have power comparable to the large commercial and research clusters, but being tailored to the needs of OAS members it serves the institute well and can act as a training ship for later access to larger facilities when necessary. The document describes in detail the hardware and software characteristics of the cluster, including all the main scientific software packages installed, with a brief explanation of their use.
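
    As a concrete illustration of the batch mode described above, a hedged sketch of submitting a job through Slurm from a login node; the partition name, resource numbers, program and data path are hypothetical placeholders, since the document does not list the actual queue names.

        import subprocess
        import textwrap

        # Write a Slurm job script and hand it to sbatch, the standard Slurm
        # submission command available on the login nodes. Everything named
        # below (partition "compute", my_pipeline, the data path) is hypothetical.
        job_script = textwrap.dedent("""\
            #!/bin/bash
            #SBATCH --job-name=reduction
            #SBATCH --partition=compute
            #SBATCH --cpus-per-task=8
            #SBATCH --mem=16G
            #SBATCH --time=02:00:00
            #SBATCH --output=reduction-%j.log

            ./my_pipeline --input /data/run42
            """)

        with open("job.sbatch", "w") as f:
            f.write(job_script)

        subprocess.run(["sbatch", "job.sbatch"], check=True)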